Appendix A Distribution of Class Labels Across Each Probing Task

Neural Information Processing Systems

We also implemented the Iterative Null-Space Projection (INLP) method (Ravfogel et al., 2020) to remove each linguistic property. Results using our method are in Table 4; results using the INLP method show the same pattern across all of the linguistic properties that we tested. We also note that each language brain region is not necessarily homogeneous in function across all of the voxels it contains. The bottom plot displays the pretrained BERT vs. removal of all tasks.

As in the probing experiments with BERT in the main paper, we also perform experiments with GPT2. We find the results to be similar to BERT, i.e., a rich hierarchy of linguistic signals: initial to middle layers encode surface information, middle layers encode syntactic information, and middle to top layers encode more abstract (semantic) information. We verify that removing each linguistic property from GPT2 reduces the corresponding task performance across all layers, as expected.
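For concreteness, the INLP procedure can be sketched as follows: at each iteration, fit a linear classifier to predict the property from the representations, then project the representations onto the null space of the classifier's weight matrix, so that direction can no longer be used; repeating this removes successive linearly decodable directions. This is a minimal sketch, not the original implementation — the function name `inlp`, the use of scikit-learn's `LogisticRegression` as the linear probe, and the fixed iteration count are our assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def inlp(X, y, n_iters=3):
    """Minimal INLP sketch: iteratively project out directions from which
    a linear classifier can decode the property y from representations X.

    Returns the projected representations and the cumulative projection
    matrix P (apply to new data as X_new @ P.T).
    """
    d = X.shape[1]
    P = np.eye(d)          # cumulative projection, starts as identity
    Xp = X.copy()
    for _ in range(n_iters):
        # Fit a linear probe on the current (projected) representations.
        clf = LogisticRegression(max_iter=1000).fit(Xp, y)
        W = clf.coef_      # shape (n_classes or 1, d)
        # Projector onto the row space of W, then its complement (null space).
        P_row = W.T @ np.linalg.pinv(W.T)
        P_null = np.eye(d) - P_row
        # Compose with the projections from earlier iterations.
        P = P_null @ P
        Xp = X @ P.T
    return Xp, P

# Toy usage: the label is linearly decodable from dimension 0 of X.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = (X[:, 0] > 0).astype(int)
Xp, P = inlp(X, y)
acc_before = LogisticRegression(max_iter=1000).fit(X, y).score(X, y)
acc_after = LogisticRegression(max_iter=1000).fit(Xp, y).score(Xp, y)
```

After a few iterations, a fresh linear probe trained on the projected representations falls to near-chance accuracy, which is the removal criterion the method relies on.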